# Dual Model Fusion
## Archaeo 12B (GGUF)
- Author: Delta-Vector
- Type: Large Language Model

Archaeo-12B is a 12B-parameter large language model designed for role-playing and creative writing, produced by merging the Rei-12B and Francois-Huali-12B models.
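Merges like Archaeo-12B combine two checkpoints parameter by parameter. The simplest fusion method is a weighted linear average of the two models' weights; the sketch below is a hypothetical minimal illustration using toy numpy "checkpoints", not Archaeo's actual merge recipe.

```python
import numpy as np

def linear_merge(state_a, state_b, alpha=0.5):
    """Parameter-wise weighted average of two model checkpoints.

    state_a / state_b: dicts mapping parameter names to arrays of equal shape.
    alpha: weight given to model A; (1 - alpha) goes to model B.
    """
    assert state_a.keys() == state_b.keys()
    return {name: alpha * state_a[name] + (1 - alpha) * state_b[name]
            for name in state_a}

# Toy two-parameter "checkpoints" standing in for full model state dicts.
a = {"w": np.array([2.0, 4.0]), "b": np.array([1.0])}
b = {"w": np.array([0.0, 0.0]), "b": np.array([3.0])}
merged = linear_merge(a, b, alpha=0.5)   # w -> [1.0, 2.0], b -> [2.0]
```

Real merges operate on every tensor of the transformer and often vary `alpha` per layer; the principle is the same elementwise combination.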
## Copus 2x8B
- Author: lodrick-the-lafted
- Type: Large Language Model
- Framework: Transformers

Copus-2x8B is a Mixture of Experts model built on the Llama-3-8B architecture, combining fine-tuned versions of dreamgen/opus-v1.2-llama-3-8b and NousResearch/Meta-Llama-3-8B-Instruct.
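A Mixture of Experts model such as Copus-2x8B keeps both source models as "experts" and routes each token through a learned gate. The following is a hypothetical minimal sketch of top-2 gating in plain numpy (the gate matrix is random here, not Copus's trained router):

```python
import numpy as np

def top2_gate(hidden, w_gate):
    """Route one token to its top-2 experts.

    hidden: (d,) token representation
    w_gate: (d, n_experts) gating matrix
    Returns the chosen expert indices and their normalized mixing weights.
    """
    logits = hidden @ w_gate                      # (n_experts,) gate scores
    top2 = np.argsort(logits)[-2:][::-1]          # two largest, descending
    probs = np.exp(logits[top2] - logits[top2].max())
    probs /= probs.sum()                          # softmax over the chosen two
    return top2, probs

rng = np.random.default_rng(0)
d, n_experts = 16, 2                              # a 2xNB model has 2 experts
hidden = rng.normal(size=d)
w_gate = rng.normal(size=(d, n_experts))
experts, weights = top2_gate(hidden, w_gate)
# The token's output would be weights[0] * expert_a(hidden)
#                            + weights[1] * expert_b(hidden)
```

With only two experts, top-2 routing always uses both; the gate still decides how much each expert contributes per token.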
## DeepMagic-Coder-7b Alt
- Author: rombodawg
- License: Other
- Type: Large Language Model
- Framework: Transformers

DeepMagic-Coder-7b is a merge of the DeepSeek-Coder and Magicoder models, focused on code generation and programming tasks.
## Synatra 7B V0.3 RP Mistral 7B Instruct V0.2 Slerp
- Author: MaziyarPanahi
- License: Apache-2.0
- Type: Large Language Model
- Framework: Transformers

This model was created by spherical linear interpolation (slerp) fusion of the instruction-tuned Mistral-7B-Instruct-v0.2 and the role-playing-focused Synatra-7B, combining instruction-following and role-playing capabilities.
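Slerp interpolates along the great-circle arc between two weight tensors rather than along the straight line, which better preserves the geometry of the two models' parameters. A minimal sketch on toy vectors (this is the standard slerp formula, not the exact implementation used to build the model above):

```python
import numpy as np

def slerp(v0, v1, t):
    """Spherical linear interpolation between two flattened weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate t moves along
    the arc between their directions instead of the straight chord.
    """
    u0 = v0 / np.linalg.norm(v0)
    u1 = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(u0, u1), -1.0, 1.0)
    theta = np.arccos(dot)                 # angle between the two tensors
    if theta < 1e-6:                       # nearly parallel: plain lerp
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(a, b, 0.5)    # halfway along the arc: [sqrt(2)/2, sqrt(2)/2]
```

In a real merge, slerp is applied tensor by tensor across the two checkpoints, often with a different `t` per layer group.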